Yes and no...
We are referring to these vehicles as self-driving cars. This is not cruise control. Recognizing that these are systems under development and not mature technologies, I don't expect them to be perfect. I realize they will never be perfect. But saying that the driver in an autonomous car should be, or is expected to be, as attentive as one doing the actual driving sounds like a release that lawyers would make somebody sign knowing full well it won't happen.
All that said, Uber deployed a car with negligently designed software. How can the first lines of code not be something like:

If Distance < 1.1 * Stopping_distance:
    Mode = Stop
Else:
    ...
Yes, I'm oversimplifying, because an adjacent lane may be available, etc., but the computer should NEVER hit something (well, unless something falls off the overpass right in front of the car). If it assumes the object in question is a car that will accelerate, that's fine... but what if, two seconds into the six-second time to impact, it still hasn't accelerated? Time to lift? Brush the brakes? Notify the driver while there is still time to do something? ABSO DUCKING LUTELY!!! Now you are going 30 and still have four seconds (less distance but lower speed; great, any non-negligent algorithm just bought some time). STILL not accelerating? Slow down more. Change lanes if one is available. Can't do that? STOP!!!
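The escalation described above could be sketched roughly like this. To be clear, every name and threshold here is hypothetical on my part, not anything from Uber's actual code; it's just the "get progressively more cautious" idea in concrete form:

```python
def plan_response(time_to_impact, obstacle_accelerating, lane_available):
    """Pick an escalating action for a slow or ambiguous obstacle ahead.

    All parameters and thresholds are illustrative assumptions:
    time_to_impact in seconds, the other two are booleans.
    """
    if obstacle_accelerating:
        return "maintain"        # assumption held: the object is clearing out of the way
    if time_to_impact > 4.0:
        return "lift_and_alert"  # lift off the throttle, notify the safety driver
    if time_to_impact > 2.0:
        return "brake"           # brush the brakes, buy more time
    if lane_available:
        return "change_lane"     # adjacent lane is open, use it
    return "stop"                # last resort: never hit the object


# Walking through the scenario above: six seconds out and not accelerating,
# you lift and alert; a couple of seconds later you brake; boxed in with
# impact imminent, you stop.
print(plan_response(6.0, False, False))  # lift_and_alert
print(plan_response(3.0, False, False))  # brake
print(plan_response(1.0, False, False))  # stop
```

The point of the ordering is that each branch buys time for the next decision, and "stop" is always reachable.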
Like I said before, this sounds like a situation that could have happened with any slow-moving ambiguous shape. Did Uber consider slow-moving construction equipment? A lady pushing a stroller? A disabled car being pushed by good Samaritans? How many examples do you need? There are enough real-world, everyday situations that could have fit this pattern that Uber was negligent in not having an "I don't really know what it is, I'd better slow down" mode.
(In response to this post by Beerman)
Posted: 08/30/2018 at 2:10PM